Clustering in Weight Space of Feedforward Nets
Authors
Abstract
We study symmetries of feedforward networks in terms of their corresponding groups and find that these groups naturally act on and partition weight space. We specify an algorithm to generate representative weight vectors in a specific fundamental domain. The analysis of the metric structure of the fundamental domain enables us to use the location information of weight vector estimates, e.g. for cluster analysis. This can be implemented efficiently even for large networks.
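The abstract does not spell out the algorithm, but for a one-hidden-layer tanh network the symmetry group is generated by hidden-unit permutations and sign flips (tanh is odd, so negating a unit's input and output weights leaves the network function unchanged). A minimal sketch of mapping a weight vector to a representative in a fundamental domain, under these assumptions (the function name `canonicalize` and the specific ordering convention are illustrative, not taken from the paper):

```python
import numpy as np

def canonicalize(W_in, w_out):
    """Map the weights of a one-hidden-layer tanh network to a
    canonical representative of its symmetry-group orbit.

    W_in : (h, d) input-to-hidden weights, one row per hidden unit
    w_out: (h,)   hidden-to-output weights
    """
    W_in, w_out = W_in.copy(), w_out.copy()
    # Sign symmetry: tanh(-x) = -tanh(x), so flip any unit whose
    # output weight is negative; the network function is unchanged.
    flip = w_out < 0
    W_in[flip] *= -1.0
    w_out[flip] *= -1.0
    # Permutation symmetry: sort hidden units lexicographically by
    # their input weight rows to fix a unique ordering.
    order = np.lexsort(W_in.T[::-1])
    return W_in[order], w_out[order]
```

Two weight vectors that differ only by such symmetries map to the same representative, so distances between canonical forms (rather than raw weight vectors) are what a cluster analysis in weight space would operate on.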
Related papers
Simplifying Neural Nets by Discovering Flat Minima
We present a new algorithm for finding low complexity networks with high generalization capability. The algorithm searches for large connected regions of so-called "flat" minima of the error function. In the weight-space environment of a "flat" minimum, the error remains approximately constant. Using an MDL-based argument, flat minima can be shown to correspond to low expected overfitting. Al...
Generalisation in Cubic Nodes - Centres and Clustering
This lecture deals with training nets of cubic nodes and introduces another major (quite general) algorithm, Reward-Penalty. Insight into how we might train nets of cubic nodes is provided by considering the problems associated with generalisation in these nets. We then go on to consider feedback or recurrent nets from the point of view of their implementing iterated feedforward nets (recall thi...
Analog Neural Nets with Gaussian or Other Common Noise Distributions Cannot Recognize Arbitrary Regular Languages
We consider recurrent analog neural nets where the output of each gate is subject to Gaussian noise or any other common noise distribution that is nonzero on a sufficiently large part of the state-space. We show that many regular languages cannot be recognized by networks of this type, and we give a precise characterization of languages that can be recognized. This result implies severe constra...
Journal:
Volume, Issue:
Pages: -
Publication date: 1996